Training Artificial Neural Networks: Backpropagation via Nonlinear Optimization

Authors

  • Jadranka Skorin-Kapov
  • Wendy Tang
Abstract

In this paper we explore different strategies for guiding the backpropagation algorithm used to train artificial neural networks. Two variants of the steepest-descent-based backpropagation algorithm and four variants of the conjugate gradient algorithm are tested. The variants differ in whether or not the time component is used, and in whether or not additional gradient information is utilized during one-dimensional optimization. Testing is performed on randomly generated data as well as on some benchmark data regarding energy prediction. Based on our test results, the most promising backpropagation strategy appears to be to start with the steepest descent algorithm and then continue with the conjugate gradient algorithm. The backpropagation-through-time strategy combined with conjugate gradients appears promising as well.
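The hybrid strategy the abstract recommends, steepest descent first and conjugate gradients afterwards, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the toy network, the data, the fixed step size (used in place of the one-dimensional line search the paper discusses), and the Polak-Ribière update rule are all assumptions made for the example.

```python
import numpy as np

# Toy regression data and a one-hidden-layer tanh network.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))
y = X[:, :1] * X[:, 1:2]          # target: product of the two inputs

W1 = rng.normal(scale=0.5, size=(2, 8))
W2 = rng.normal(scale=0.5, size=(8, 1))

def unpack(w):
    return w[:16].reshape(2, 8), w[16:].reshape(8, 1)

def loss_grad(w):
    """Mean-squared error and its gradient via backpropagation."""
    A, B = unpack(w)
    H = np.tanh(X @ A)            # forward pass
    err = H @ B - y
    loss = 0.5 * np.mean(err ** 2)
    dout = err / len(X)           # dL/d(output)
    gB = H.T @ dout
    gH = (dout @ B.T) * (1 - H ** 2)
    gA = X.T @ gH
    return loss, np.concatenate([gA.ravel(), gB.ravel()])

w = np.concatenate([W1.ravel(), W2.ravel()])
loss0, _ = loss_grad(w)
lr = 0.1

# Phase 1: plain steepest descent on the weights.
for _ in range(300):
    _, g = loss_grad(w)
    w -= lr * g

# Phase 2: Polak-Ribiere conjugate-gradient directions (with a fixed
# step instead of an exact line search, to keep the sketch short).
_, g = loss_grad(w)
d = -g
for _ in range(300):
    w += lr * d
    _, g_new = loss_grad(w)
    beta = max(0.0, float(g_new @ (g_new - g)) / float(g @ g + 1e-12))
    d = -g_new + beta * d
    g = g_new

final_loss, _ = loss_grad(w)
print(loss0, final_loss)
```

The `max(0.0, ...)` clamp is the common "PR+" restart safeguard: when the Polak-Ribière coefficient goes negative, the direction resets to plain steepest descent.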


Similar resources

Beyond Backpropagation: Using Simulated Annealing for Training Neural Networks

The vast majority of neural network research relies on a gradient algorithm, typically a variation of backpropagation, to obtain the weights of the model. Because of the enigmatic nature of complex nonlinear optimization problems, such as training artificial neural networks, this technique has often produced inconsistent and unpredictable results. To go beyond backpropagation’s typical selectio...


HYBRID ARTIFICIAL NEURAL NETWORKS BASED ON ACO-RPROP FOR GENERATING MULTIPLE SPECTRUM-COMPATIBLE ARTIFICIAL EARTHQUAKE RECORDS FOR SPECIFIED SITE GEOLOGY

The main objective of this paper is to use ant-optimized neural networks to generate artificial earthquake records. In this regard, training accelerograms were selected according to the site geology of the recording station, and the Wavelet Packet Transform (WPT) was used to decompose these records. Then Artificial Neural Networks (ANN) optimized with Ant Colony Optimization and the resilient backpropagation algorith...


On Mappable Nonlinearities in Robustness Analysis - Automatic Control, IEEE Transactions on

K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Networks, vol. 2, pp. 359-366, 1989. A. Pinkus, n-Widths in Approximation Theory. New York: Springer-Verlag, 1986. F. Girosi, “Regularization theory, radial basis functions and networks,” From Statistics to Neural Networks: Theory and Pattern Recognition Applications, V. Cherkassky, J. H....


Functional Approximation Using Neuro-genetic Hybrid Systems

Artificial neural networks provide a methodology for solving many types of nonlinear problems that are difficult to solve using traditional techniques. Neuro-genetic hybrid systems bring together the benefits of artificial neural networks and the inherent advantages of evolutionary algorithms. A functional approximation method using neuro-genetic hybrid systems is proposed in this paper. Three evol...


Conjugate Gradient Methods in Training Neural Networks

Training of artificial neural networks is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. For the supervised learning of multilayer feed-forward neural networks, the backpropagation algorithm has proven to be one of the most successful neural network algorithms. Although backpropagation training has proved to be effi...




Publication year: 2004